
    Lease maturity and initial rent: is there a term structure for UK commercial property leases?

    This paper investigates the relationship between lease maturity and rent in commercial property. Over the last decade, market-led changes to lease structures, the threat of government intervention and the associated emergence of the Codes of Practice for commercial leases have stimulated growing interest in the pricing of commercial property leases. Seminal work by Grenadier (1995) derived a set of hypotheses about the pricing of different lease lengths in different market conditions. Whilst there is a compelling theoretical case for, and a strong intuitive expectation of, differential pricing of different lease maturities, to date the empirical evidence is inconclusive. Two Swedish studies have found mixed results (Gunnelin and Soderbergh 2003; Englund et al. 2003): in only half the cases is the null hypothesis that lease length has no effect rejected. In the UK, Crosby et al. (2003) report counterintuitive results: in some markets they find that short lease terms are associated with low rents, whilst in others they are associated with high rents. Drawing upon a substantial database of commercial lettings in central London (West End and City of London) over the last decade, we investigate the relationship between rent and lease maturity. In particular, we test whether a building quality variable omitted in previous studies provides empirical results that are more consistent with the theoretical and intuitive a priori expectations. It is found that initial lease rates are upward sloping with the lease term and that this relationship is constant over time.
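    As an illustration of the kind of specification such a test implies, the sketch below regresses log initial rent on lease term, a building-quality score and letting-year dummies using synthetic data. The variable names, parameter values and data are assumptions for illustration only, not the authors' dataset or model.

        # Minimal hedonic-style sketch on synthetic data; not the authors' specification.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 500
        term = rng.integers(5, 26, n)               # lease length in years (assumed range)
        quality = rng.normal(0.0, 1.0, n)           # hypothetical building-quality score
        year = rng.integers(0, 10, n)               # letting-year index over a decade
        log_rent = 3.0 + 0.01 * term + 0.2 * quality + 0.02 * year + rng.normal(0, 0.1, n)

        # Design matrix: intercept, lease term, quality, and year dummies (base year omitted).
        year_dummies = (year[:, None] == np.arange(1, 10)).astype(float)
        X = np.column_stack([np.ones(n), term, quality, year_dummies])
        beta, *_ = np.linalg.lstsq(X, log_rent, rcond=None)
        print("estimated slope on lease term:", beta[1])   # positive => upward-sloping term structure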

    A stochastic model for the evolution of the web allowing link deletion

    Recently several authors have proposed stochastic evolutionary models for the growth of the web graph and other networks that give rise to power-law distributions. These models are based on the notion of preferential attachment, leading to the "rich get richer" phenomenon. We present a generalisation of the basic model by allowing deletion of individual links and show that it also gives rise to a power-law distribution. We derive the mean-field equations for this stochastic model and show that, by examining a snapshot of the distribution at the steady state of the model, we are able to tell whether any link deletion has taken place and estimate the link deletion probability. Our model enables us to gain some insight into the distribution of inlinks in the web graph; in particular, it suggests a power-law exponent of approximately 2.15 rather than the widely published exponent of 2.1.
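    A minimal simulation of the mechanism described above is sketched below: growth by preferential attachment with occasional deletion of a uniformly chosen link. The deletion probability and graph size are assumptions for illustration; the sketch is not the paper's mean-field derivation.

        # Illustrative simulation only: preferential attachment plus random link deletion.
        import numpy as np

        rng = np.random.default_rng(1)
        p_delete = 0.1          # assumed probability of a deletion step
        edges = [(0, 1)]        # seed graph: a single link
        n_nodes = 2

        for step in range(50_000):
            if rng.random() < p_delete and edges:
                edges.pop(rng.integers(len(edges)))            # delete a uniformly chosen link
            else:
                # attach a new node to an endpoint chosen proportionally to its degree
                endpoints = [v for e in edges for v in e] or [0]
                target = endpoints[rng.integers(len(endpoints))]
                edges.append((n_nodes, target))
                n_nodes += 1

        indeg = np.zeros(n_nodes, dtype=int)
        for _, t in edges:
            indeg[t] += 1
        # The tail of this in-link distribution is approximately power-law; its exponent
        # shifts as p_delete grows, which is what the steady-state snapshot test exploits.
        print("max in-degree:", indeg.max(), "mean in-degree:", indeg.mean())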

    A stochastic evolutionary model for capturing human dynamics

    The recent interest in human dynamics has led researchers to investigate the stochastic processes that explain human behaviour in various contexts. Here we propose a generative model to capture the dynamics of survival analysis, traditionally employed in clinical trials and reliability analysis in engineering. We derive a general solution for the model in the form of a product, and then a continuous approximation to the solution via the renewal equation describing age-structured population dynamics. This enables us to model a wide range of survival distributions, according to the choice of the mortality distribution. We provide empirical evidence for the validity of the model from a longitudinal data set of popular search engine queries over 114 months, showing that the survival function of these queries is closely matched by the solution for our model with power-law mortality.
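    The product-form solution lends itself to a very short numeric sketch. The one below assumes a power-law mortality rate and evaluates the resulting survival function over a 114-month horizon; the parameter values are illustrative, not fitted to the query data.

        # Minimal numeric sketch assuming power-law mortality mu(t) = c * t**(-alpha);
        # not the authors' fitted model.
        import numpy as np

        c, alpha = 0.5, 1.2                  # assumed mortality parameters
        t = np.arange(1, 115)                # months, matching the 114-month horizon
        mu = np.clip(c * t**(-alpha), 0.0, 1.0)

        # Product-form survival function: probability a query is still "alive" after t months.
        survival = np.cumprod(1.0 - mu)
        print(survival[:5])                  # heavy-tailed decay rather than exponential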

    Developing Prognosis Tools to Identify Learning Difficulties in Children Using Machine Learning Technologies

    The Mental Attributes Profiling System was developed in 2002 (Laouris and Makris, Proceedings of Multilingual & Cross-Cultural Perspectives on Dyslexia, Omni Shoreham Hotel, Washington, D.C., 2002) to provide a multimodal evaluation of the learning potential and abilities of young children's brains. The method is based on the assessment of non-verbal abilities using video-like interfaces and was compared to more established methodologies (Papadopoulos, Laouris and Makris, Proceedings of the IDA 54th Annual Conference, San Diego, 2003), such as the Wechsler Intelligence Scale for Children (Watkins et al., Psychol Sch 34(4):309–319, 1997). To do so, various tests have been applied to a population of 134 children aged 7–12 years old. This paper addresses the issue of identifying a minimal set of variables that are able to accurately predict the learning abilities of a given child. The use of machine learning technologies to do this provides the advantage of making no prior assumptions about the nature of the data and eliminating the natural bias associated with data processing carried out by humans. Kohonen's Self-Organising Map algorithm (Kohonen, Biol Cybern 43:59–69, 1982) is able to split a population into groups based on large and complex sets of observations. Once the population is split, the individual groups can then be probed for their defining characteristics, providing insight into the rationale of the split. The characteristics identified form the basis of classification systems that are able to accurately predict which group an individual will belong to, using only a small subset of the tests available. The specifics of this methodology are detailed herein, and the resulting classification systems provide an effective tool to prognose the learning abilities of new subjects.
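    The sketch below shows, on synthetic data, how a small self-organising map can split a population of score vectors into groups that are afterwards assigned cluster labels. The map size, learning schedule and data are assumptions for illustration, not the study's configuration or test battery.

        # Minimal self-organising map sketch in numpy; synthetic data, illustrative only.
        import numpy as np

        rng = np.random.default_rng(2)
        X = rng.normal(size=(134, 10))            # 134 children x 10 hypothetical test scores

        grid_w, grid_h = 4, 4                     # 4x4 map of units (assumed size)
        weights = rng.normal(size=(grid_w * grid_h, X.shape[1]))
        coords = np.array([(i, j) for i in range(grid_w) for j in range(grid_h)], dtype=float)

        for epoch in range(200):
            lr = 0.5 * (1 - epoch / 200)          # decaying learning rate
            sigma = 2.0 * (1 - epoch / 200) + 0.5 # shrinking neighbourhood radius
            for x in X[rng.permutation(len(X))]:
                bmu = np.argmin(((weights - x) ** 2).sum(axis=1))      # best-matching unit
                d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)         # grid distance to BMU
                h = np.exp(-d2 / (2 * sigma ** 2))                     # neighbourhood function
                weights += lr * h[:, None] * (x - weights)             # pull units toward sample

        # Assign each child to its best-matching unit; these clusters can then be probed
        # for the small subset of tests that best separates them.
        clusters = np.argmin(((X[:, None, :] - weights[None, :, :]) ** 2).sum(axis=2), axis=1)
        print(np.bincount(clusters, minlength=grid_w * grid_h))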

    A new mask-based objective measure for predicting the intelligibility of binary masked speech

    Mask-based objective speech-intelligibility measures have been successfully proposed for evaluating the performance of binary masking algorithms. These objective measures were computed directly by comparing the estimated binary mask against the ground-truth ideal binary mask (IdBM). Most of these objective measures, however, assign equal weight to all time-frequency (T-F) units. In this study, we propose to improve the existing mask-based objective measures by weighting each T-F unit according to its target or masker loudness. The proposed objective measure shows significantly better performance than two other existing mask-based objective measures.
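    The weighting idea can be sketched as follows: agreement between the estimated mask and the IdBM is scored per time-frequency unit, with each unit scaled by an assumed target-loudness value rather than counted equally. The data and the hit-minus-false-alarm form below are illustrative assumptions, not the exact published measure.

        # Hedged sketch of a loudness-weighted mask comparison; not the published measure.
        import numpy as np

        rng = np.random.default_rng(3)
        ideal = rng.random((64, 100)) > 0.5                      # IdBM over 64 channels x 100 frames
        estimated = ideal ^ (rng.random(ideal.shape) > 0.9)      # estimate with ~10% unit errors
        loudness = rng.random(ideal.shape)                       # hypothetical target loudness per T-F unit

        # Loudness-weighted hit and false-alarm rates instead of equal unit weights.
        hits = (estimated & ideal) * loudness
        false_alarms = (estimated & ~ideal) * loudness
        hit_rate = hits.sum() / np.maximum((ideal * loudness).sum(), 1e-12)
        fa_rate = false_alarms.sum() / np.maximum((~ideal * loudness).sum(), 1e-12)
        print("weighted HIT - FA:", hit_rate - fa_rate)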

    Towards a comprehensive evaluation of ultrasound speckle reduction

    Over the last three decades, several despeckling filters have been developed to reduce the speckle noise inherently present in ultrasound images without losing the diagnostic information. In this paper, a new intensity- and feature-preservation evaluation metric for full speckle-reduction evaluation is proposed, based on contrast and feature similarities. A comparison of the despeckling methods is carried out using quality metrics and visual interpretation of image profiles to evaluate their performance and show the benefits each one can contribute to noise reduction and feature preservation. To test the methods, noise-free images and simulated B-mode ultrasound images are used. This way, the despeckling techniques can be compared using numeric metrics, taking the noise-free image as a reference. In this study, a total of seventeen different speckle reduction algorithms, based on adaptive filtering, diffusion filtering and wavelet filtering, have been documented and evaluated with sixteen quality metrics.
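    A rough sketch of what a contrast- and feature-similarity score against a noise-free reference might look like is given below; the gradient-based feature term, the equal weighting and the synthetic phantom are assumptions for illustration, not the metric proposed in the paper.

        # Illustrative sketch only; not the paper's metric.
        import numpy as np

        def contrast_feature_score(reference, filtered, eps=1e-12):
            # Contrast similarity: ratio-style comparison of global standard deviations.
            c_ref, c_fil = reference.std(), filtered.std()
            contrast_sim = (2 * c_ref * c_fil + eps) / (c_ref**2 + c_fil**2 + eps)

            # Feature similarity: correlation between simple gradient-magnitude maps.
            gy_r, gx_r = np.gradient(reference)
            gy_f, gx_f = np.gradient(filtered)
            edges_r = np.hypot(gx_r, gy_r).ravel()
            edges_f = np.hypot(gx_f, gy_f).ravel()
            feature_sim = np.corrcoef(edges_r, edges_f)[0, 1]

            return 0.5 * (contrast_sim + feature_sim)   # equal weighting is an assumption

        # Synthetic example: a smooth phantom and a version with multiplicative speckle.
        rng = np.random.default_rng(4)
        ref = np.tile(np.linspace(0, 1, 128), (128, 1))
        noisy = ref * rng.gamma(shape=4.0, scale=0.25, size=ref.shape)
        print(contrast_feature_score(ref, noisy))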

    Use cases, best practice and reporting standards for metabolomics in regulatory toxicology

    Metabolomics is a widely used technology in academic research, yet its application to regulatory science has been limited. The most commonly cited barrier to its translation is lack of performance and reporting standards. The MEtabolomics standaRds Initiative in Toxicology (MERIT) project brings together international experts from multiple sectors to address this need. Here, we identify the most relevant applications for metabolomics in regulatory toxicology and develop best practice guidelines, performance and reporting standards for acquiring and analysing untargeted metabolomics and targeted metabolite data. We recommend that these guidelines are evaluated and implemented for several regulatory use cases.